Debiasing convex regularized estimators and interval estimation in linear models

Authors

Pierre C. Bellec and Cun-Hui Zhang

Abstract

New upper bounds are developed for the $L_2$ distance between $\xi/\text{Var}[\xi]^{1/2}$ and linear and quadratic functions of $z\sim N(0,I_n)$ for random variables of the form $\xi=bz^\top f(z) - \text{div} f(z)$. The linear approximation yields a central limit theorem when the squared norm of $f(z)$ dominates the squared Frobenius norm of $\nabla f(z)$ in expectation. Applications of this normal approximation are given for the asymptotic normality of de-biased estimators in linear regression with correlated design and convex penalty in the regime $p/n \to \gamma$ for a constant $\gamma\in(0,\infty)$. For the estimation of linear functions $\langle a_0,\beta\rangle$ of the unknown coefficient vector $\beta$, the analysis leads to asymptotic normality of the de-biased estimate for most normalized directions $a_0$, where ``most'' is quantified in a precise sense. This asymptotic normality holds for any convex penalty if $\gamma<1$ and for any strongly convex penalty if $\gamma\ge 1$. In particular, the penalty need not be separable or permutation invariant. By allowing arbitrary regularizers, the results vastly broaden the scope of applicability of de-biasing methodologies to obtain confidence intervals in high dimensions. In the absence of strong convexity for $p>n$, asymptotic normality of the de-biased estimate is obtained for the Lasso and the group Lasso under additional conditions. For general convex penalties, our analysis also provides prediction and estimation error bounds of independent interest.
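To make the de-biasing recipe concrete, here is a minimal simulation sketch (not code from the paper) of a de-biased Lasso estimate of a single coordinate $\langle e_j,\beta\rangle$ under an isotropic Gaussian design, using a degrees-of-freedom adjustment $n-\hat{\mathrm{df}}$ of the kind advocated in this line of work. The penalty level, the direction $a_0=e_j$, and the residual-norm-based interval width are illustrative assumptions, not prescriptions from the paper.

```python
# Illustrative sketch (not the paper's code): de-biased Lasso for one
# coordinate of beta under an isotropic Gaussian design (Sigma = I_p),
# with the degrees-of-freedom adjustment n - df_hat.
import numpy as np
from scipy import stats
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 500, 1000, 10                  # proportional regime p/n = 2, sparse truth
beta = np.zeros(p)
beta[:s] = 1.0
X = rng.standard_normal((n, p))          # rows ~ N(0, I_p)
y = X @ beta + rng.standard_normal(n)    # noise sd 1

# Penalty level ~ sqrt(2 log p / n) is an illustrative universal choice.
lasso = Lasso(alpha=np.sqrt(2 * np.log(p) / n), fit_intercept=False)
b_hat = lasso.fit(X, y).coef_
resid = y - X @ b_hat
df_hat = np.count_nonzero(b_hat)         # Lasso degrees of freedom |supp(b_hat)|

j = 0                                    # direction a_0 = e_j
theta_hat = b_hat[j] + X[:, j] @ resid / (n - df_hat)

# Normal-approximation 95% interval; the half-width ||resid|| / (n - df_hat)
# is on the sigma / sqrt(n) scale of the classical de-biased Lasso.
half = stats.norm.ppf(0.975) * np.linalg.norm(resid) / (n - df_hat)
print(f"theta_hat = {theta_hat:.3f} in [{theta_hat - half:.3f}, {theta_hat + half:.3f}]")
```

The key departure from the classical correction $X_j^\top(y-X\hat\beta)/n$ is the $(n-\hat{\mathrm{df}})$ normalization, which is what this line of work uses to retain asymptotic normality in the proportional regime $p/n\to\gamma$.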


Similar Articles

Ridge Stochastic Restricted Estimators in Semiparametric Linear Measurement Error Models

In this article we consider stochastic restricted ridge estimation in semiparametric linear models when the covariates are measured with additive errors. The development of a penalized corrected likelihood method in such models is the basis for the derivation of the ridge estimates. The asymptotic normality of the resulting estimates is established. Also, a necessary and sufficient condition...


Robust Estimation in Linear Regression with Multicollinearity and Sparse Models

One of the factors affecting the statistical analysis of data is the presence of outliers. Methods that are not affected by outliers are called robust methods. Robust regression methods are robust estimation methods for the parameters of a regression model in the presence of outliers. Besides outliers, the linear dependency of regressor variables, which is called multicollinearity...


Linear and convex aggregation of density estimators

We study the problem of learning the best linear and convex combination of M estimators of a density with respect to the mean squared risk. We suggest aggregation procedures and we prove sharp oracle inequalities for their risks, i.e., oracle inequalities with leading constant 1. We also obtain lower bounds showing that these procedures attain optimal rates of aggregation. As an example, we con...


Piecewise Convex Function Estimation: Pilot Estimators

Given noisy data, function estimation is considered when the unknown function is known a priori to consist of a small number of regions where the function is either convex or concave. When the number of regions is unknown, the model selection problem is to determine the number of convexity change points. For kernel estimates in Gaussian noise, the number of false change points is evaluated as a...


Regularized Autoregressive Multiple Frequency Estimation

The paper addresses the problem of tracking multiple frequencies using Regularized Autoregressive (RAR) approximation. The RAR procedure decreases approximation bias compared to other AR-based frequency detection methods, while still providing competitive variance of the sample estimates. We show that the RAR estimates of multiple periodicities are consistent in probabilit...



Journal

Journal title: Annals of Statistics

Year: 2023

ISSN: 0090-5364, 2168-8966

DOI: https://doi.org/10.1214/22-aos2243